Von Neumann Entropy Penalization and Low Rank Matrix Estimation
A problem of statistical estimation of a Hermitian nonnegative definite
matrix of unit trace (for instance, a density matrix in quantum state
tomography) is studied. The approach is based on a penalized least squares
method with a complexity penalty defined in terms of von Neumann entropy. A
number of oracle inequalities are proved, showing how the error of the
estimator depends on the rank and other characteristics of the oracles. The
proofs are based on the theory of empirical processes and on probabilistic
inequalities for random matrices, in particular, noncommutative versions of
the Bernstein inequality.
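The penalty term above is built from the von Neumann entropy $S(\rho) = -\mathrm{tr}(\rho\log\rho)$ of a density matrix. As a minimal numerical sketch (not the paper's estimator), the entropy can be computed from the eigenvalues of the matrix; the clipping constant `eps` is an assumption added here to guard against $\log 0$ for rank-deficient states:

```python
import numpy as np

def von_neumann_entropy(rho, eps=1e-12):
    # For a Hermitian density matrix rho with eigenvalues lam_k >= 0,
    # S(rho) = -tr(rho log rho) = -sum_k lam_k log lam_k.
    lam = np.linalg.eigvalsh(rho)
    lam = np.clip(lam, eps, None)  # avoid log(0) when rho is rank-deficient
    return float(-np.sum(lam * np.log(lam)))

# The maximally mixed state in dimension d has entropy log(d),
# while a pure state (rank one) has entropy 0.
d = 4
rho_mixed = np.eye(d) / d
print(von_neumann_entropy(rho_mixed))  # approx log(4) ≈ 1.386
```

Low-rank density matrices have small entropy, which is what makes $S(\rho)$ usable as a rank-sensitive complexity penalty.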
Asymptotics and Concentration Bounds for Bilinear Forms of Spectral Projectors of Sample Covariance
Let $X, X_1, \dots, X_n$ be i.i.d. Gaussian random variables with zero mean and
covariance operator $\Sigma = \mathbb{E}(X \otimes X)$ taking values in a
separable Hilbert space $\mathbb{H}$. Let
$\mathbf{r}(\Sigma) := \mathrm{tr}(\Sigma)/\|\Sigma\|$ be the effective rank of
$\Sigma$, $\mathrm{tr}(\Sigma)$ being the trace of $\Sigma$ and $\|\Sigma\|$
being its operator norm. Let
$\hat\Sigma_n := n^{-1}\sum_{j=1}^n (X_j \otimes X_j)$ be
the sample (empirical) covariance operator based on $(X_1, \dots, X_n)$. The
paper deals with a problem of estimation of spectral projectors of the
covariance operator $\Sigma$ by their empirical counterparts, the spectral
projectors of $\hat\Sigma_n$ (empirical spectral projectors). The focus is on
the problems where both the sample size $n$ and the effective rank
$\mathbf{r}(\Sigma)$ are large. This framework includes and generalizes well known
high-dimensional spiked covariance models. Given a spectral projector $P_r$
corresponding to an eigenvalue $\mu_r$ of the covariance operator $\Sigma$ and its
empirical counterpart $\hat P_r$, we derive sharp concentration bounds for
bilinear forms of the empirical spectral projector $\hat P_r$ in terms of the sample
size $n$ and the effective dimension $\mathbf{r}(\Sigma)$. Building upon these
concentration bounds, we prove the asymptotic normality of bilinear forms of
the random operators $\hat P_r - \mathbb{E}\hat P_r$ under the assumptions that
$n \to \infty$ and $\mathbf{r}(\Sigma) = o(n)$. In the special case of eigenvalues of
multiplicity one, these results are rephrased as concentration bounds and
asymptotic normality for linear forms of empirical eigenvectors. Other results
include bounds on the bias $\mathbb{E}\hat P_r - P_r$ and a method of bias
reduction, as well as a discussion of possible applications to statistical
inference in high-dimensional principal component analysis.
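The objects in this abstract are easy to simulate. The sketch below (an illustration under assumed parameters, not the paper's analysis) builds a spiked covariance matrix, computes its effective rank $\mathrm{tr}(\Sigma)/\|\Sigma\|$, and compares the empirical spectral projector of the top eigenvalue with the true one; the dimensions, spike size, and sample size are all arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical spiked covariance: one large eigenvalue, the rest small.
d, n = 50, 2000
eigvals = np.array([10.0] + [0.5] * (d - 1))
Sigma = np.diag(eigvals)

# Effective rank r(Sigma) = tr(Sigma) / ||Sigma|| (operator norm).
r_eff = np.trace(Sigma) / np.max(eigvals)

# Sample covariance operator of n i.i.d. zero-mean Gaussian vectors.
X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
Sigma_hat = X.T @ X / n

# Empirical spectral projector onto the top eigenvector of Sigma_hat
# (eigh returns eigenvalues in ascending order, so the last column is the top one).
w, V = np.linalg.eigh(Sigma_hat)
u = V[:, -1]
P_hat = np.outer(u, u)

# True spectral projector onto the top eigenvector e_1 of Sigma.
P = np.zeros((d, d))
P[0, 0] = 1.0
print("effective rank:", r_eff)
print("projector error ||P_hat - P||:", np.linalg.norm(P_hat - P, 2))
```

Note that the projector $\hat P_r = u u^\top$ is invariant to the sign ambiguity of the eigenvector $u$, which is one reason concentration results are naturally stated for projectors rather than eigenvectors themselves.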